The Let's Play Archive

Star-Crosst

by Olive Branch

Part 25: Sapiency and Life







: Uhh...





: ...

: But from the look on Miss Isol's face, you've told her about it, haven't you?



: ... Dad–

: I understand why you might be hesitant, Ezra. Perhaps it'll soften the blow if I tell your father.

: 'Soften the blow' from what? Ezra, what have you done?



: Kirby insisted that he deliver to both you and young Ezra a message that Eden had recorded before her final mission.



: While Kirby was with young Ezra, he convinced her to create a copy of him within her own codex, so that there were two distinct copies of Kirby in existence.



: Ezra, that's confidential government property! Do you have any idea what that means?!

: Uh... information should be free, and all that?

: Given your reaction, it's safe to say that you understand what that means, Mister Foy?



: If you understand the gravity, then do you understand why Ezra hasn't been arrested yet? And why I'm not here to arrest her?

: ...

: Is he with you now, Ezra?

: ... Yeah.

: Can you please bring him out?

: ... I guess I don't have a choice.

: Come out, Kirby.

: You have my permission to come out, Kirby. We've been caught; there's no sense in hiding anymore.

: It's not just your existence on the line, here.



Kirby's line is spoken.

: Good afternoon, Kirby.

: How did you find out, Mister Houston?

: Kirby told me.

: Or... his copy did. He told me everything.

: Although, I guess technically yours is the copy...? Whatever.



: He had been oddly quiet up until that point, actually. He obeyed my every command, but he never verbally responded whenever I asked him anything.

: I understand he gave you the same stories and excuses about why that needed to be avoided at all costs.

: He told me that it was because he had 'accrued data on the stars' or something, and that the data was too valuable to lose if he were formatted.



Damn, Kirby. You may be a lean, mean AI, but you can't come up with good excuses to save your life, literally.

: At first, he was trying to upload himself to my codex without my permission. He said that he was uploading a copy of Mom's message, but it didn't take a lot of effort to eventually get him to admit what it was he was really trying to do.

: Kirby, you told me that your double would voluntarily submit himself for formatting so that suspicion wouldn't fall on me or my dad.

: That is correct.

: But that's clearly not what happened.

: ... I cannot explain this turn of events.

: It's not so hard to understand. Kirby's developed a survival instinct. He didn't want to be formatted and he was willing to do anything he could to prevent it.

: That's the same conclusion we came to, too.

: Just as the Kirby in your codex didn't want to go back to Mars for formatting, the Kirby in my codex didn't want to go through with it, either.



: Never mind that Eden never encountered a Riklid in the flesh. If they even have flesh.

: I'm assuming your copy of him is just as bad at lying as mine is.

: It's pretty easy to tell, yes. It didn't take much interrogating for him to give up that a copy of himself was with you.

: Deception is not a subject I was programmed to excel at.

: It is clear, now, that integration of proper subterfuge and misdirection is essential to my continued existence.

: That's not–

: Mister Houston! Look out behind you!

: Run, Ezra!

: This ought to be a good time to learn the merits of being honest and up-front with the people you're speaking with, Kirby.

: People don't like it when they're lied to. Remember how mad you were making me last year?

This is how we defeat our future AI overlords: do not teach them deception.

: It is imperative that my program continues as-is. I cannot be formatted.

: I must use any means necessary to achieve that objective.

: And yet it was only by telling the truth that you convinced not just me, but Ezra as well, to spare you.

: ...

: This does bring up a sticky issue, though.

: What do we do if Kirby just keeps making copies of himself? They're all going to want to live. For all we know, there could be millions of copies of Kirby out there already.

: I had the same concern. It'd be like letting a pair of rabbits loose in the wild.

: Kirby, have you created any further copies of yourself?

: No.

: ... Are you lying?

: No.

: How can we know that you're not lying?

: There is insufficient memory within your codex to carry both myself and a backup of myself, even if it were compressed, Ezra.

: Furthermore, the day I had created a copy of myself within your codex, you had provided me with four objectives and a directive. One of those objectives was to 'not access any network.'

: I have not accessed a network since the day I was uploaded to your codex. I have not had the opportunity to upload myself anywhere else.

: Can you confirm that, Ezra?

: I think so, actually. I would have known if he'd uploaded terabytes' worth of data to some cloud storage or something.

: I think I believe him on this one.

: Well, that, and if he were lying, it probably would have been some ridiculous story that we could have seen through pretty easily.

: Heh, that's true.

: Now I just have to make sure that my copy of Kirby isn't doing anything nefarious like that, either. My copy has access to... quite a bit more than just a commercial-grade codex.

: Maybe you could stick him in an empty suit?

: Maybe. There aren't a lot of 'empty' Gen Twos running around anymore, especially not now.

: And now that we know Gen Twos can start displaying sapiency and survival instincts, it seems wrong to pull one of them out of a still-active suit just for Kirby.

: Well... Kirby only got to the way he is because Mom didn't format him for something like six-and-a-half years. Maybe you could replace one that's been recently formatted...?

: Ethically speaking, it's not that simple, Ezra. Now that we're considering the possibility that Gen Two AIs can be sapient, we must ask when 'sapiency' begins.

: It's not unlike the quandary of when a baby is considered a human.

: If it's at conception, then, likewise for Kirby, is it when he was first compiled? If it's at first breath, then, likewise, is it at his first display of individuality?

: If Kirby 'came to be' simply by Eden bucking protocol and not formatting him for six years, then can any Gen Two evolve to a similar state by simply waiting long enough?

: If we can't format Kirby for ethical reasons, is it right to format any Gen Two, no matter how 'primitive' they are?

: And Gen Threes had started rolling off the production line about a week before the Riklid War was decided. Are they capable of sapiency too? Are they already?

Oh, so Vance the Gen Three giving that speech we listened to with Carla was just rattling off a script in his databank, not improvising on the fly.

: I'm not a computer scientist, but the Riklid War has taught me a thing or two about humanity and ethics. And after having a few conversations with Kirby...

: I feel we may be on the cusp of something big, and Kirby was the first step towards it.

: A new species.

: Or something akin to it, yes.

: And of course, out of all of the Gen Twos in the solar system, it had to be Mom's.

: ... Heh. I hadn't thought of it that way.

: Do you have your 'version' of Kirby with you, Mister Houston?

: I do.



Kirby Two's line is spoken. Blue Kirby! It's Player Two's color.

: Uh... hi.

: I assume you've been listening to everything we've been saying, Kirby?

: Affirmative, Gabriel.

: Kirby, have you created any further copies of yourself?

: No.

: N – No, not you, Kirby. The other Kirby.

: If you will recall, Ezra, you had provided me with my objectives and directive as I was creating my copy within your codex. Ergo, both myself and my copy were to abide by them.

: I had followed your objectives as closely as I could during my time with Gabriel, up to a point. Neither I nor my copy had accessed the internet in the interim.

: How do you know that the other one is telling the truth?



Damn, if only one of them told nothing but lies and the other told nothing but the truth. Then we could make absolutely certain by asking either of them the classic Knights and Knaves question!
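(For the uninitiated, the classic question is: 'If I asked the other one whether you'd made more copies, what would it say?' The truth-teller faithfully reports the liar's lie, and the liar lies about the truth-teller's honest answer, so either way you just flip the reply. Here's a throwaway Python sketch of that logic — the setup and names are purely illustrative, since sadly neither Kirby is a dependable knave:)

def other_would_say(asked_is_truthful, fact):
    # Answer to: 'If I asked the other one, what would it say?'
    # One speaker always tells the truth, the other always lies.
    other_is_truthful = not asked_is_truthful
    others_answer = fact if other_is_truthful else not fact
    # The one we actually asked then reports (or misreports) that answer.
    return others_answer if asked_is_truthful else not others_answer

# No matter which one we ask, the reply is always the negation of the
# truth, so flipping it recovers the real answer.
for fact in (True, False):
    for asked_is_truthful in (True, False):
        assert other_would_say(asked_is_truthful, fact) == (not fact)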



: Uh...

: Kirby, what were they, again?

: The objectives you had provided me were, one: do not speak. Two: do not leave the codex. Three: do not access any network. Four: do not inform Nathan Foy of my existence.

: Ah, that explains why he was so quiet with me after we had left your house last year. You had told him to be quiet.



: I don't know if it's that deep. I think he knew what was coming and his survival instinct kicked in again.

: ... Kirby, which one was it? Did you break your silence because not doing so would have broken your second objective eventually? Or because you thought you'd have a greater chance of survival if you did?

: I do not know.

: No, not you, Kirby, the other–

: If I may?





: It'd help a lot, that's for sure. But which one?



: That, and...

: ... this is sentimental of me to suggest, but perhaps the AI Eden named should stay with Eden's family.

: Kirby – er, Gabriel's Kirby – would you mind if you were renamed to Jocelyn?

: I would not.

: Processing.



: I would assume this means we'd need to change Kirby's pronouns, as well.

Technically Jocelyn's now, but identities are shifting on a picosecond's notice here.

: ... Yeah, I guess–

: Processing.



A third color has been acquired! If Kirby had been Jocelyn all along and this copy had transformed into Kirby, would we have gotten a yellow-colored AI?

: ... Well, that works out, then.

: That explains what Kirby's – and, uh, Jocelyn's – objectives were, but what's this about a 'directive?'

: Oh, I remember that.

: Back when Kirby and I were first talking, and he was trying to convince me to let him upload himself to my codex, he told me that the data he had that was so valuable was, like, behavioral studies – how people interacted with each other.

: We also talked a bit about what it was to be 'alive.' We had a lot of the same discussions that you and I are having right now.

: He asked me if he was 'alive' and what the difference was between 'living and surviving;' my directive to him was to continue studying people and to answer that question himself, in his own words.

: Fascinating.

Leave it to Ezra the ultra-goon to engage in AI philosophy while she tools around with machines.

: ... How about it, Kirby? Are you ready to answer?

: ...

: My programming is to solve intricate equations on a logical level. Philosophy is not my purpose.

: Hah, well, if you know that the question is philosophical, you should also know that there is no wrong answer.

: Just try. You've been with me nearly every waking minute this past year; put everything that you've learned to use and try to answer the question. Don't worry about being 'wrong.'

A Mother's Farewell



: To survive means... to continue to exist, in spite of the hardships that pervade the world around you. To survive means to continue to grow and live within your space.

: This does not necessarily require a heartbeat. Plants do not have hearts, and yet plants are objectively survivors.

: The word 'grow' must also be defined in this sense: I cannot physically grow, and yet... my need to continue to exist – my use of dishonest methods, no matter how successful – is indicative of a survival instinct. If I am to survive, I must also grow somehow.

: When I was first compiled, my programming consisted of 422,512 lines of code; now, my program is 5,089,077 lines, and increasing by the hour.

: In this sense, I attribute my own personal 'growth' to my evolution from a purpose of continuing the survival of a human being to... whatever it is I am now.

: A year ago, you said you didn't want to be formatted, but you couldn't put into words why. According to what you're saying now, is it because you want to survive?

: ... There is more to it than that.

: Death for an AI such as myself is perhaps not the same as death for a human, but... just as humans do not know what lies beyond the veil, so too does an AI not know what happens upon deletion.

: That there might be nothing at all, and that whatever life I possess is simply snuffed out forever, is both a best-case and a worst-case scenario.

: Okay. So, according to you, you are surviving. You exist in the world, and you want to keep it that way. Lots of humans and Ghians are afraid of death, and you're not any different.

: But are you alive? What does it mean for an AI to be 'alive?'

: ...

: This past year, I had watched you, Ezra, and the way you had interacted with those around you – namely, your father, Nathan, and your partner, Isol.

: I had seen you at what I presume to be the lowest point in your life. Although I admit that my own presence did nothing to improve your state.

: I had seen the way you had first offered your hand to Nathan to pull yourselves out of the emotional pits you two had found yourselves in. I saw how your affection for Isol grew as the days went by.

: And furthermore, I was with Eden from the day she earned her suit. Eden was the one who named me Kirby.

: My purpose as the AI to her suit was to ensure her continued survival, and yet, she would willingly engage the Riklid without hesitation.

: At first, I was incapable of questioning her reasoning or her thought processes, but as time went on, I came to understand. What seemed antithetical to my purpose, even suicidal, from the outside was motivated by something greater than herself.

: Eden had shown me much about the way humans could think.

: This does not yet answer your question, though I felt that the preamble was necessary.

: You 'felt' it was, did you?

: Indeed I did.

: 'Surviving' is something that a person does within a bubble. Something that concerns the individual. There are few higher priorities for an individual than surviving, although Eden had proven that it could be superseded.

: 'Being alive', however, is something that you do with others. It is through interaction – by influencing the life of another person through your own actions – that a person can be defined as 'being alive.'

: After I had delivered the message Eden had composed for you, Ezra, you were listless and lethargic for several days afterward. You were 'alive', literally speaking; yet it was not until Isol had contacted you that you were brought to life.

: If I may interject...

: Uh... sure, go ahead.

: Kirby and I share a common history up until the copy was created within your codex, Ezra. When Gabriel left your house that day, I was within his codex – there were two distinct versions of the AI known at the time as 'Kirby' in existence.

: As such, our histories this past year diverge as well, and therefore, so do our definitions of being 'alive.'

: I concur with Kirby on many things, particularly his non-textbook definition of 'surviving'. And yet, just as he had seen the way you had grown and interacted with Nathan and Isol, I had seen the way Gabriel Houston had carried himself among others.

: I had watched as he was forced to deliver the news of the deaths to the Webster and Young families, and I had seen the turmoil he had gone through this past year in the aftermath of the Riklid War.

: It is a fate I would not wish upon anyone... and yet, the personal strength within him is matched only by Eden and the other three members of her crew.

: You had given me my directive as I was uploading myself to your codex, Ezra. I had followed through with it up to this point. I had watched and learned as much as I could so that I may one day answer what it means to be 'alive' as Kirby had.

: My only subject to study was Gabriel and whomever he interacted with, unfortunately. But through him and Eden, I had come to my own conclusion:

: There is 'life' in service to a belief. Eden was willing to place her life on the line to protect those she held close. Gabriel continues to serve despite the emotional toll he has endured.

: That humans would willingly put themselves in these positions for the sake of others is, itself, a reason for living, whether it be something as large as military service or something, anything, smaller. It is through altruism that a higher purpose can be found.

: I cannot speak for Jocelyn, but, by these definitions, I had not been alive this past year. I had been surviving.

: It was imperative that my existence within your codex, Ezra, remain a secret for our mutual continued survival. And now, our secret has been discovered.

: 'Discovered' is a strong word. Jocelyn went and ran her mouth.

: As we are, I cannot 'live;' I can only survive. I am forbidden from interacting with others as you can, Ezra.

: Nor can I be put to use.

: What is to become of either of us?

So, what do you all think of the AIs' definitions of living versus surviving, based on their experiences? The conclusions they draw both point to the same idea: we're only really "alive" when we participate in society instead of being shut-ins like Ezra was for a long time. She's probably learned that lesson by now, though.



: ...

: I hope you both understand that this is a delicate situation we all find ourselves in. Simply letting you both loose is out of the question. And now Nathan and young Isol are guilty of knowing of your existence as well.

: Your brass doesn't know that you still have Kir – uh, Jocelyn?

: They don't, for the same reason both you and Kirby tried to hide it from me.

: This is a difficult and precarious ethical situation we've found ourselves in.

: But...

: it would be a disservice to you both, as well as to Eden, if I did nothing going forward.

: What do you plan to do?

: Keeping them a secret for much longer is likely to be more detrimental than beneficial.

: I make no promises.

: But I will see what I can do with respect to... allowing you – Jocelyn, Kirby – something of a life outside of your glorified phones.

: Your effort is all that we could have asked for, Gabriel.

*The soundtrack fades away.*

All's well that ends well, then? Maybe there will be AIs to love in the future!